To Each According to its Need: Kernel Class Specific Classifiers
Authors
Abstract
We present in this paper a new multi-class Bayes classifier that permits using separate feature vectors, chosen specifically for each class. This technique extends previous work on the feature-based Class Specific Classifier to kernel methods, using a new class of Gibbs probability distributions whose energy functions are nonlinear kernel mappings. The resulting method, which we call the Kernel Class Specific Classifier, permits using a different kernel and a different feature set for each class. Moreover, the proper kernel for each class can be learned from the training data with a leave-one-out technique, removing the ambiguity regarding the proper choice of feature vectors for a given class. Experiments on appearance-based object recognition show the power of the proposed approach.
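As a rough illustration of the idea (not the authors' implementation), the sketch below classifies a point by evaluating a per-class Gibbs-style energy, here taken to be the squared kernel distance to the class mean in feature space, with a separate RBF kernel bandwidth and a separate training set per class. The function names, the RBF kernel choice, and this particular energy are illustrative assumptions; the paper's own energy functions and kernel-learning procedure may differ.

```python
import math

def rbf(x, y, gamma):
    # Gaussian (RBF) kernel between two feature vectors.
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def class_energy(x, samples, gamma):
    # Gibbs-style energy of x under one class:
    # squared kernel distance to the class mean in feature space,
    # E_c(x) = k(x,x) - 2*mean_i k(x, x_i) + mean_{i,j} k(x_i, x_j).
    n = len(samples)
    k_xx = rbf(x, x, gamma)
    k_xm = sum(rbf(x, s, gamma) for s in samples) / n
    k_mm = sum(rbf(s, t, gamma) for s in samples for t in samples) / n ** 2
    return k_xx - 2 * k_xm + k_mm

def classify(x, class_data):
    # class_data: {label: (samples, gamma)} -- each class carries its own
    # training samples (its own features) and its own kernel parameter,
    # so classes need not share a feature representation.
    return min(class_data, key=lambda c: class_energy(x, *class_data[c]))
```

In this sketch the per-class `gamma` could itself be chosen by a leave-one-out loop over each class's training set, in the spirit of the kernel selection described above.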
Similar resources
A Simple High Performance Approach to Semantic Segmentation
We propose a simple approach to semantic image segmentation. Our system scores low-level patches according to their class relevance, propagates these posterior probabilities to pixels and uses low-level segmentation to guide the semantic segmentation. The two main contributions of this paper are as follows. First, for the patch scoring, we describe each patch with a high-level descriptor based ...
Improving Kernel Classifiers for Object Categorization Problems
This paper presents an approach for improving the performance of kernel classifiers applied to object categorization problems. The approach is based on the use of distributions centered around each training point, which are exploited for an inter-class invariant image representation with local invariant features. Furthermore, we propose an extensive use of unlabeled images for improving the SVMba...
Class Specific Object Recognition using Kernel Gibbs Distributions
Feature selection is crucial for effective object recognition. The subject has been extensively investigated in the literature, with approaches ranging from heuristic choices to statistical methods to the integration of multiple cues. For all these techniques the final result is a common feature representation for all the considered object classes. In this paper we take a completely different approach...
On the construction of extreme learning machine for online and offline one-class classification - An expanded toolbox
One-Class Classification (OCC) has been a prime concern for researchers and has been effectively employed in various disciplines. However, traditional one-class classifiers are very time consuming due to their iterative training and extensive parameter tuning. In this paper, we present six OCC methods and their thirteen variants based on extreme learning machine (ELM) and Online Sequential ELM (OSEL...
Voting Principle Based on Nearest Kernel Classifier and Naive Bayesian Classifier
This paper presents a voting principle based on multiple classifiers: the naïve Bayesian classification algorithm and a newly proposed nearest-to-class kernel classifier. The recognition ability of each classifier differs from sample to sample. A model of each classifier is obtained by training on the training data, which acts as bas...
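The combination step in such a scheme can be sketched as a weighted vote over the base classifiers' predictions. This is a minimal illustration of the voting principle only, not the paper's exact combination rule; the function name and the use of per-classifier reliability weights are assumptions.

```python
from collections import defaultdict

def weighted_vote(predictions, weights):
    # predictions: one predicted label per base classifier
    # weights: per-classifier reliability (e.g. validation accuracy)
    tally = defaultdict(float)
    for label, w in zip(predictions, weights):
        tally[label] += w
    # Return the label with the largest accumulated weight.
    return max(tally, key=tally.get)
```

For example, with predictions from a naïve Bayesian classifier and two kernel-based classifiers, the label backed by the larger total weight wins, so a single highly weighted classifier can be outvoted by two moderately weighted ones that agree.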
Journal:
Volume/Issue:
Pages: -
Publication date: 2002